Current Issue: October–December | Volume: 2017 | Issue Number: 4 | Articles: 5
Safety of human-robot physical interaction is enabled not only by suitable robot control strategies but also by suitable sensing technologies. For example, if distributed tactile sensors were available on the robot, they could be used not only to detect unintentional collisions, but also as a human-machine interface, enabling a new mode of social interaction with the machine. Building on their previous work, the authors developed a conformable distributed tactile sensor that can be easily adapted to the different parts of the robot body. Its ability to estimate contact force components and to provide a tactile map with accurate spatial resolution enables the robot to handle both unintentional collisions in safe human-robot collaboration tasks and intentional touches, where the sensor is used as a human-machine interface. In this paper, the authors present the characterization of the proposed tactile sensor and show how it can also be exploited to recognize haptic tactile gestures, by tailoring recognition algorithms well known in the image processing field to the case of tactile images. In particular, a set of haptic gestures was defined to test three recognition algorithms on a group of 20 users. The paper demonstrates how the same sensor originally designed to manage unintentional collisions can also be successfully used as a human-machine interface.
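The abstract's idea of treating tactile maps as images can be made concrete with a minimal sketch: each tactile frame is a 2D pressure map, and a gesture is a sequence of frames. The feature (pressure-weighted centroid travel), the thresholds, and the gesture names below are illustrative assumptions, not the paper's actual algorithms.

```python
# Sketch: classifying a haptic gesture from a sequence of tactile "images"
# (2D pressure maps), in the spirit of applying image-processing ideas to
# tactile data. Thresholds and gesture labels are illustrative assumptions.

def centroid(frame):
    """Pressure-weighted centroid (row, col) of one tactile frame."""
    total = sum(sum(row) for row in frame)
    if total == 0:
        return None  # no contact in this frame
    r = sum(i * sum(row) for i, row in enumerate(frame)) / total
    c = sum(j * v for row in frame for j, v in enumerate(row)) / total
    return (r, c)

def classify_gesture(frames):
    """Coarse rule: 'tap' if the contact centroid stays put across frames,
    'stroke' if it travels across the taxel grid."""
    cents = [c for c in (centroid(f) for f in frames) if c is not None]
    if not cents:
        return "none"
    dr = cents[-1][0] - cents[0][0]
    dc = cents[-1][1] - cents[0][1]
    travel = (dr ** 2 + dc ** 2) ** 0.5  # centroid displacement in taxels
    return "stroke" if travel > 1.5 else "tap"
```

A real system would replace the rule with the trained recognizers evaluated in the paper, but the input representation (a stream of tactile images) is the same.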
This paper describes a novel strategy for Radio-Frequency Identification (RFID) tag detection for human–robot interaction (HRI) purposes. The anisotropic detection pattern of the RFID reader antenna is combined with a probabilistic algorithm to obtain a coarse angular position relative to the RFID reader that can be used, for example, for behavioral control based on proxemics areas around the robot. The success rate achieved is suitable for HRI purposes. The paper presents experimental results on a detection model for the reader.
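The combination of an anisotropic detection pattern with a probabilistic algorithm can be sketched as a Bayesian filter over discrete angular sectors: because the antenna detects tags in front of it more reliably than behind it, each read attempt (hit or miss) shifts the belief about which sector the tag occupies. The per-sector detection probabilities below are invented numbers for illustration, not the paper's measured detection model.

```python
# Sketch: coarse angular localization of an RFID tag via Bayesian updates
# over discrete sectors, exploiting an anisotropic antenna pattern.
# P_DETECT values are assumed for illustration, not the paper's model.

SECTORS = ["front", "left", "back", "right"]
P_DETECT = {"front": 0.9, "left": 0.4, "back": 0.1, "right": 0.4}

def update(prior, detected):
    """One Bayesian update of the sector belief after a read attempt."""
    post = {}
    for s in SECTORS:
        likelihood = P_DETECT[s] if detected else (1.0 - P_DETECT[s])
        post[s] = prior[s] * likelihood
    z = sum(post.values())  # normalization constant
    return {s: p / z for s, p in post.items()}

belief = {s: 1.0 / len(SECTORS) for s in SECTORS}  # uniform prior
for hit in [True, True, False, True]:  # simulated read outcomes
    belief = update(belief, hit)
best_sector = max(belief, key=belief.get)
```

With mostly successful reads, the belief concentrates on the sector where detection is most likely, giving the coarse angular estimate that proxemics-based behavioral control needs.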
An important aspect of Human–Robot Interaction is responding to different kinds of touch stimuli. To date, several technologies have been explored to determine how a touch is perceived by a social robot, usually placing a large number of sensors throughout the robot's shell. In this work, we introduce a novel approach, where the audio acquired from contact microphones located in the robot's shell is processed using machine learning techniques to distinguish between different types of touches. The system is able to determine when the robot is touched (touch detection) and to ascertain the kind of touch performed among a set of possibilities: stroke, tap, slap, and tickle (touch classification). This proposal is cost-effective: a single microphone is enough to cover each solid part of the robot, so just a few microphones can cover the whole shell. Besides, it is easy to install and configure, as it only requires attaching the microphone to a contact surface on the robot's shell and plugging it into the robot's computer. Results show high accuracy in touch gesture recognition. The testing phase revealed that Logistic Model Trees achieved the best performance, with an F-score of 0.81. The dataset was built with information from 25 participants performing a total of 1981 touch gestures.
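The two-stage pipeline in the abstract (touch detection, then touch classification) can be sketched with simple hand-crafted audio features. The paper uses a trained Logistic Model Trees classifier; the rule-based stand-in below, with invented thresholds and only a subset of the gesture classes, is just to make the stages concrete.

```python
# Sketch of the two-stage pipeline: (1) touch detection via an energy
# threshold on the contact-microphone signal, (2) classification from
# simple features. Thresholds are illustrative assumptions; the paper
# trains a Logistic Model Trees model instead of these rules.

def rms(signal):
    """Root-mean-square energy of the audio frame."""
    return (sum(x * x for x in signal) / len(signal)) ** 0.5

def zero_crossing_rate(signal):
    """Fraction of adjacent sample pairs that change sign."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return crossings / max(len(signal) - 1, 1)

def detect_touch(signal, energy_thresh=0.05):
    """Stage 1: is the robot being touched at all?"""
    return rms(signal) > energy_thresh

def classify_touch(signal):
    """Stage 2: coarse rules over energy and ZCR (illustrative only)."""
    if not detect_touch(signal):
        return "no-touch"
    energy, zcr = rms(signal), zero_crossing_rate(signal)
    if energy > 0.5:
        return "slap"    # high-energy impact
    if zcr > 0.3:
        return "tickle"  # rapid oscillating contact
    return "stroke"      # sustained low-frequency contact
```

In the paper's setup the features would be extracted per audio window and fed to the trained model, but the detect-then-classify structure is the same.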
For task-centric human–computer interactions, information display is a crucial element on which users base their decisions. Since providing more information may not always promote the user's awareness of the situation, we investigate the relationship between information availability and user decision-making characteristics by conducting an experiment in the form of a war-simulation game. The results show that different types of operators rely on different types of information in decision-making. A workflow of adaptive provision of information is also introduced, using our conceptual architecture for an adaptive user interface for task-centric in-vehicle applications. The pilot study presented in the article is an attempt to bring user-oriented design to task-centric in-vehicle operations and meet the different requirements of operator decision-making.
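The "adaptive provision of information" workflow amounts to filtering what is displayed according to the operator's decision-making profile. A minimal sketch, assuming hypothetical operator profiles and information types (the article's own taxonomy is not reproduced here):

```python
# Sketch: an adaptive interface shows only the information types the
# current operator profile relies on. Profiles and information types
# below are hypothetical, not the article's categories.

INFO_RELIANCE = {
    "analytical": ["numeric status", "trend charts"],
    "intuitive": ["map overview", "alerts"],
}

def select_display(operator_type, available_info):
    """Return the subset of available information matched to the operator;
    unknown profiles get nothing filtered in (empty selection here)."""
    preferred = INFO_RELIANCE.get(operator_type, [])
    return [item for item in available_info if item in preferred]
```

A production system would classify the operator from observed behavior rather than take the profile as a given label.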
This study aimed to explore the effects of dominant and compliant personalities on both flow experience and the external characteristics of flow experience. A total of 48 participants were recruited to play an online game and subsequently asked to recall the songs they had heard while playing. Eye blink rate was recorded. The results demonstrated that (1) participants were more immersed in the game if they were relatively dominant or noncompliant; (2) perception of the external environment declined remarkably in the flow state; and (3) eye blink rates decreased only when flow occurred at the beginning of the game, rather than throughout the whole process. The results suggest that gamers who tend to be dominant or noncompliant are more likely to experience flow. Eye blink rate and perception of the external environment could serve as objective indicators of flow experience.